Variable selection by lasso-type methods
Authors
Abstract
Similar resources
Thresholded Lasso for High Dimensional Variable Selection
Given n noisy samples with p dimensions, where n ≪ p, we show that the multi-step thresholding procedure based on the Lasso – we call it the Thresholded Lasso – can accurately estimate a sparse vector β ∈ R^p in a linear model Y = Xβ + ε, where X_{n×p} is a design matrix normalized to have column ℓ2-norm √n, and ε ∼ N(0, σ²I_n). We show that under the restricted eigenvalue (RE) condition (Bickel-Ritov-T...
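For intuition, here is a minimal Python sketch of the multi-step idea described above (an illustration, not the authors' code): simulate a sparse linear model, fit the Lasso, threshold the small coefficients, and refit ordinary least squares on the retained support. The regularization weight alpha and the threshold 0.5 are arbitrary illustrative choices.

import numpy as np
from sklearn.linear_model import Lasso, LinearRegression

rng = np.random.default_rng(0)
n, p, s = 100, 500, 5                               # n << p, s-sparse truth
X = rng.standard_normal((n, p))
X *= np.sqrt(n) / np.linalg.norm(X, axis=0)         # columns scaled to l2-norm sqrt(n)
beta = np.zeros(p)
beta[:s] = 3.0
y = X @ beta + rng.normal(0.0, 1.0, size=n)         # Y = X beta + eps, eps ~ N(0, sigma^2 I_n)

lasso = Lasso(alpha=0.1).fit(X, y)                           # step 1: Lasso fit
support = np.flatnonzero(np.abs(lasso.coef_) > 0.5)          # step 2: threshold small coefficients
ols = LinearRegression().fit(X[:, support], y)               # step 3: refit OLS on the kept variables

beta_hat = np.zeros(p)
beta_hat[support] = ols.coef_
print("selected variables:", support)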
Regularizing Lasso: a Consistent Variable Selection Method
Table 1 provides the average computational time (in minutes) for the eight methods under the simulation settings. SIS clearly requires the least computational effort, whereas RLASSO as well as Scout require much longer computational times. But all methods except RLASSO(CLIME) can be computed within a reasonable amount of time for p = 5000 and n = 100. RLASSO(CLIME) takes much longer because of in...
Pre-Selection in Cluster Lasso Methods for Correlated Variable Selection in High-Dimensional Linear Models
We consider variable selection problems in high dimensional sparse regression models with strongly correlated variables. To handle correlated variables, the concept of clustering or grouping variables and then pursuing model fitting is widely accepted. When the dimension is very high, finding an appropriate group structure is as difficult as the original problem. We propose to use Elastic-net a...
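To make the grouping idea concrete, here is a hedged Python sketch of one possible pre-selection workflow (an assumption for illustration, not the authors' algorithm): cluster strongly correlated columns via hierarchical clustering on 1 − |correlation|, keep one representative per cluster, and fit an Elastic-net on the reduced design. The correlation cutoff and the Elastic-net parameters are illustrative choices.

import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.linear_model import ElasticNet

rng = np.random.default_rng(1)
n, p = 100, 200
latent = rng.standard_normal((n, 20))
X = latent[:, rng.integers(0, 20, size=p)] + 0.1 * rng.standard_normal((n, p))  # blocks of correlated columns
y = X[:, 0] - X[:, 50] + rng.standard_normal(n)

corr = np.corrcoef(X, rowvar=False)
dist = 1.0 - np.abs(corr)                                    # distance = 1 - |correlation|
Z = linkage(dist[np.triu_indices(p, 1)], method="average")   # condensed pairwise distances
labels = fcluster(Z, t=0.3, criterion="distance")            # cut tree: clusters of correlated variables
reps = np.array([np.flatnonzero(labels == c)[0] for c in np.unique(labels)])  # one column per cluster

enet = ElasticNet(alpha=0.05, l1_ratio=0.5).fit(X[:, reps], y)
selected = reps[np.abs(enet.coef_) > 1e-8]
print("representatives selected:", selected)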
Thresholded Lasso for high dimensional variable selection and statistical estimation
Given n noisy samples with p dimensions, where n ≪ p, we show that the multi-step thresholding procedure based on the Lasso – we call it the Thresholded Lasso – can accurately estimate a sparse vector β ∈ R^p in a linear model Y = Xβ + ε, where X_{n×p} is a design matrix normalized to have column ℓ2-norm √n, and ε ∼ N(0, σ²I_n). We show that under the restricted eigenvalue (RE) condition (Bickel-Rito...
Improved Variable Selection with Forward-Lasso Adaptive Shrinkage
Recently, considerable interest has focused on variable selection methods in regression situations where the number of predictors, p, is large relative to the number of observations, n. Two commonly applied variable selection approaches are the Lasso, which computes highly shrunk regression coefficients, and Forward Selection, which uses no shrinkage. We propose a new approach, “Forward-Lasso A...
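As a quick illustration of the two baselines mentioned above (a sketch on simple simulated data, not the Forward-Lasso Adaptive Shrinkage procedure proposed in that paper), the following compares the variables picked by a cross-validated Lasso with those picked by plain forward selection:

import numpy as np
from sklearn.linear_model import LassoCV, LinearRegression
from sklearn.feature_selection import SequentialFeatureSelector

rng = np.random.default_rng(2)
n, p = 80, 40
X = rng.standard_normal((n, p))
y = 2.0 * X[:, 0] + 1.5 * X[:, 1] + rng.standard_normal(n)

lasso = LassoCV(cv=5).fit(X, y)                              # shrinkage; penalty chosen by cross-validation
lasso_vars = np.flatnonzero(np.abs(lasso.coef_) > 1e-8)

fwd = SequentialFeatureSelector(LinearRegression(), n_features_to_select=2,
                                direction="forward").fit(X, y)   # greedy selection, no shrinkage
fwd_vars = np.flatnonzero(fwd.get_support())

print("Lasso selects:", lasso_vars, "| forward selection selects:", fwd_vars)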
Journal
Journal title: Pakistan Journal of Statistics and Operation Research
Year: 2011
ISSN: 2220-5810, 1816-2711
DOI: 10.18187/pjsor.v7i2-sp.389